5 - Pattern Recognition [PR] - PR 3 [ID:21820]

Welcome back to Pattern Recognition. Today we want to review a couple of basics that are important for the remainder of this class. We will look into simple classification, supervised and unsupervised learning, and also review a little bit of probability theory. So this is a kind of refresher: in case your probability theory has become a bit rusty, you will find the examples in this video very instructive.

So let's go into our pattern recognition basics. The system for classification we already reviewed in the previous videos: our pattern recognition system consists of the pre-processing, the feature extraction, and the classification. You can see that we generally use F to indicate the signal that goes as input into the system, G for the pre-processed image, and after the feature extraction we have some abstract features C that are then used in the classification to predict some class Y. Of course, this is all associated with training samples that we use in order to learn the parameters of our problem. Typically, these datasets consist of tuples: we have some vectors X1, X2, and so on, each associated with a class, indicated here as Y1, Y2, and so on, and these tuples form a training dataset. With this dataset we are then able to estimate the parameters of our desired classification system. This is the supervised case. Of course, there is also the unsupervised case, in which you don't have any class labels available but just the mere observations X1, X2, and so on. From these you cannot tell the class, but you know the distribution of the observed samples, and you can start modeling things like clustering; a small sketch of both cases follows below.
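To make this concrete, here is a minimal sketch in Python of what such datasets look like; the feature values and labels are made up for illustration and are not from the lecture:

```python
import numpy as np

# Supervised case: a training dataset of tuples (x_i, y_i), where each
# x_i is a d-dimensional feature vector (here d = 2) and y_i is the
# class it is associated with.
X = np.array([[0.5, 1.2],
              [0.9, 0.8],
              [2.1, 2.4],
              [1.8, 2.9]])
y = np.array([0, 0, 1, 1])      # two-class problem: labels 0 and 1
training_set = list(zip(X, y))  # [(x1, y1), (x2, y2), ...]

# Unsupervised case: only the observations x1, x2, ... are given.
# No class labels are available, but the samples still tell us about
# the distribution of X, which is what clustering methods model.
X_unlabeled = X
```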

Let's introduce a bit of notation. Generally, we will use here the vector X that lives in a d-dimensional feature space. The vector is typically associated with some class number: you could assign it to 0 or 1, which gives a two-class problem, but you could also indicate the classes with -1 and +1. For the time being we will look into two-class problems, but generally we can also expand this to multi-class problems. There you could, for example, use class numbers, or you can use one-hot encoded vectors; in that case your class is no longer a simple number or scalar but is encoded as a vector, as sketched below. If you attended Deep Learning, you will have seen that we use this concept quite heavily there.
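As a small illustration of one-hot encoding (the class numbers below are made up, and the helper function is just a sketch, not something from the lecture):

```python
import numpy as np

def one_hot(labels, num_classes):
    """Encode integer class numbers as one-hot vectors."""
    encoded = np.zeros((len(labels), num_classes))
    encoded[np.arange(len(labels)), labels] = 1.0
    return encoded

labels = np.array([0, 2, 1])   # class numbers for a 3-class problem
print(one_hot(labels, num_classes=3))
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]
```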

What else do we need? Well, we want to talk a bit about probabilities. Here, P(Y) is the prior probability of a certain class Y, so it is essentially associated with the structure of the problem and the frequency of that respective class; we will look into a couple of examples on the next few slides. Then we have the evidence, which is generally P(X), the probability of the observation X; in the general case X lives in a d-dimensional feature space. Furthermore, there is the joint probability P(X, Y), which is essentially the probability of X and Y occurring together. And then there are the conditional probabilities, in particular the class-conditional probability, which is given as P(X | Y), and the so-called posterior probability, which is P(Y | X). So the posterior is essentially the probability of a certain class given some observation X.
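These quantities are tied together by Bayes' rule; the video has not spelled this out at this point, but it is the standard relation between them:

$$P(Y \mid X) \;=\; \frac{P(X \mid Y)\,P(Y)}{P(X)}, \qquad P(X) \;=\; \sum_{y} P(X \mid Y = y)\,P(Y = y)$$

That is, the posterior is the class conditional weighted by the prior and normalized by the evidence.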

Now, this is fairly abstract, so let's look into some examples of how these probabilities are constructed, and we'll go with a very simple example, the coin flip. In a coin flip you can have essentially two outcomes, heads and tails, so here we don't live in a d-dimensional feature space; instead, we only have two different discrete values for our observation X. You can see here we did a ...

Part of a video series:
Accessible via: Open access
Duration: 00:15:12 min
Recording date: 2020-10-26
Uploaded on: 2020-10-26 08:06:55
Language: en-US

In this short video, we introduce probability theory, conditional probability, class conditionals, priors, and posteriors.

This video is released under CC BY 4.0. Please feel free to share and reuse.

For reminders about new videos, follow us on Twitter or LinkedIn. Also, join our network for information about talks, videos, and job offers in our Facebook and LinkedIn Groups.

Music Reference: Damiano Baldoni - Thinking of You
